Computing marginal likelihoods from a single MCMC output

Author

  • Ming-Hui Chen
Abstract

In this article, we propose new Monte Carlo methods for computing a single marginal likelihood, or several marginal likelihoods, for the purpose of Bayesian model comparison. The methods are motivated by Bayesian variable selection, in which the marginal likelihoods of all subset variable models must be computed. The proposed estimates use only a single Markov chain Monte Carlo (MCMC) output from the joint posterior distribution and do not require knowledge of the specific structure or form of the MCMC sampling algorithm used to generate the sample. The theoretical properties of the proposed method are examined in detail. The applicability and usefulness of the proposed method are demonstrated via ordinal data probit regression models. A real dataset involving ordinal outcomes is used to further illustrate the proposed methodology.
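
The abstract does not reproduce the estimator itself. As a minimal illustration of the general idea — estimating a marginal likelihood from a single set of posterior draws — the sketch below applies Chib's (1995) marginal likelihood identity, log m(y) = log f(y|θ*) + log π(θ*) − log π(θ*|y), to a toy conjugate normal model where the exact answer is available, estimating the posterior ordinate from the draws with a kernel density estimate. The toy data and all variable names are illustrative assumptions; this is not Chen's proposed method.

```python
import math
import random
import statistics

random.seed(1)

# Toy conjugate model: y_i ~ N(mu, 1), prior mu ~ N(0, 1)
y = [0.3, -0.8, 1.1, 0.4, -0.2]
n, s = len(y), sum(y)

def log_norm_pdf(x, m, v):
    return -0.5 * math.log(2 * math.pi * v) - (x - m) ** 2 / (2 * v)

# Exact log marginal likelihood, available here because the model is conjugate:
# marginally y ~ N_n(0, I + 11'), det(I + 11') = 1 + n
log_m_exact = (-0.5 * n * math.log(2 * math.pi)
               - 0.5 * math.log(1 + n)
               - 0.5 * (sum(v * v for v in y) - s * s / (1 + n)))

# "MCMC output": direct draws from the known posterior N(s/(n+1), 1/(n+1))
post_mean, post_var = s / (n + 1), 1.0 / (n + 1)
draws = [random.gauss(post_mean, math.sqrt(post_var)) for _ in range(20000)]

# Chib's identity at a point mu* of high posterior density:
#   log m(y) = log L(mu*) + log prior(mu*) - log posterior(mu*)
# Estimate the posterior ordinate from the draws via a Gaussian KDE.
mu_star = post_mean
h = 1.06 * statistics.stdev(draws) * len(draws) ** (-0.2)  # Silverman bandwidth
kde = sum(math.exp(log_norm_pdf(mu_star, d, h * h)) for d in draws) / len(draws)

log_lik = sum(log_norm_pdf(v, mu_star, 1.0) for v in y)
log_m_hat = log_lik + log_norm_pdf(mu_star, 0.0, 1.0) - math.log(kde)
print(round(log_m_exact, 3), round(log_m_hat, 3))
```

Direct posterior draws stand in for an MCMC sample here; with real MCMC output the same identity applies, though the kernel-density step degrades in higher dimensions.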


Similar articles

Model Likelihoods and Bayes Factors for Switching and Mixture Models

Papers published in this report series are preliminary versions of journal articles and not for quotations. Abstract In the present paper we discuss the problem of estimating model likelihoods from the MCMC output for a general mixture and switching model. Estimation is based on the method of bridge sampling (Meng and Wong, 1996), where the MCMC sample is combined with an iid sample from an imp...
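
Meng and Wong's (1996) iterative bridge-sampling estimator can be sketched on a one-dimensional toy problem: estimating the normalizing constant of an unnormalized Gaussian density from draws from the target together with an iid sample from a normalized proposal. The toy densities and sample sizes are illustrative assumptions.

```python
import math
import random

random.seed(7)

# Target: unnormalized q(t) = exp(-t^2/2); true normalizing constant is sqrt(2*pi)
def log_q(t):
    return -0.5 * t * t

# Proposal: fully normalized N(0, sigma^2)
sigma = 1.5
def log_g(t):
    return -0.5 * math.log(2 * math.pi * sigma * sigma) - t * t / (2 * sigma * sigma)

n1 = n2 = 10000
post = [random.gauss(0.0, 1.0) for _ in range(n1)]    # draws from q (the "MCMC output")
prop = [random.gauss(0.0, sigma) for _ in range(n2)]  # iid draws from g

# Ratios l = q/g evaluated at both samples
l_post = [math.exp(log_q(t) - log_g(t)) for t in post]
l_prop = [math.exp(log_q(t) - log_g(t)) for t in prop]

s1, s2 = n1 / (n1 + n2), n2 / (n1 + n2)
r = 1.0  # initial guess for the constant
for _ in range(50):  # fixed-point iteration with the optimal bridge function
    num = sum(l / (s1 * l + s2 * r) for l in l_prop) / n2
    den = sum(1.0 / (s1 * l + s2 * r) for l in l_post) / n1
    r = num / den

print(round(r, 4), round(math.sqrt(2 * math.pi), 4))
```

The fixed-point iteration uses the asymptotically optimal bridge function; in practice the draws from q would come from the MCMC output, as in the paper summarized above.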


Improved Estimation of Normalizing Constants From Markov Chain Monte Carlo Output

Normalizing constants of conditional distributions include Bayesian marginal likelihoods and likelihoods of mixture models, such as hierarchical models and state-space time-series models. A promising method for estimating such quantities was proposed by Chib and Jeliazkov (CJ) and improved by Mira and Nicholls using bridge sampling results. Here three additional improvements and one theoretical...


Learning Deep Latent Gaussian Models with Markov Chain Monte Carlo

Deep latent Gaussian models are powerful and popular probabilistic models of high-dimensional data. These models are almost always fit using variational expectation-maximization, an approximation to true maximum-marginal-likelihood estimation. In this paper, we propose a different approach: rather than use a variational approximation (which produces biased gradient signals), we use Markov chain M...


The Family of Scale-Mixture of Skew-Normal Distributions and Its Application in Bayesian Nonlinear Regression Models

In previous studies on fitting nonlinear regression models with a symmetric structure, normality is usually assumed in the analysis of the data. This choice may be inappropriate when the distribution of the residual terms is asymmetric. Recently, the family of scale-mixture of skew-normal distributions has been the main concern of many researchers. This family includes several skewed and heavy-tailed d...
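
As background on the family this abstract refers to (not on the authors' specific regression models), a skew-normal variate admits the stochastic representation X = δ|Z0| + √(1−δ²)·Z1 with δ = α/√(1+α²), and dividing by √(V/ν) with V ~ χ²_ν yields a heavy-tailed scale-mixture member (a skew-t). A sketch with illustrative parameter values:

```python
import math
import random

random.seed(3)

# Skew-normal SN(alpha) via its stochastic representation:
#   X = delta*|Z0| + sqrt(1 - delta^2)*Z1,  delta = alpha/sqrt(1 + alpha^2)
alpha, nu = 5.0, 8
delta = alpha / math.sqrt(1.0 + alpha * alpha)

def skew_normal():
    z0, z1 = random.gauss(0, 1), random.gauss(0, 1)
    return delta * abs(z0) + math.sqrt(1.0 - delta * delta) * z1

def skew_t():
    # Scale mixture: divide by sqrt(V/nu), V ~ chi^2_nu (sum of nu squared normals)
    v = sum(random.gauss(0, 1) ** 2 for _ in range(nu))
    return skew_normal() / math.sqrt(v / nu)

draws = [skew_normal() for _ in range(20000)]
sample_mean = sum(draws) / len(draws)
# Theoretical mean of SN(alpha) is delta * sqrt(2/pi)
print(round(sample_mean, 3), round(delta * math.sqrt(2 / math.pi), 3))
```

The same mixing construction, with other mixing distributions for V, generates the rest of the scale-mixture-of-skew-normal family.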


Phylogenetic Inference via Sequential Monte Carlo

Bayesian inference provides an appealing general framework for phylogenetic analysis, able to incorporate a wide variety of modeling assumptions and to provide a coherent treatment of uncertainty. Existing computational approaches to Bayesian inference based on Markov chain Monte Carlo (MCMC) have not, however, kept pace with the scale of the data analysis problems in phylogenetics, and this ha...



Journal:

Volume   Issue

Pages  -

Publication date: 2005